Analogy and duality between random channel coding and lossy source coding

Authors

  • Sergey Tridenski
  • Ram Zamir
Abstract

Here we write in a unified fashion (using “R(P,Q,D)” [1]) the random coding exponents in channel coding and lossy source coding. We derive their explicit forms and show that, for a given random codebook distribution Q, the channel decoding error exponent can be viewed as an encoding success exponent in lossy source coding, and the channel correct-decoding exponent can be viewed as an encoding failure exponent in lossy source coding. We then extend the channel exponents to arbitrary D, which corresponds for D > 0 to erasure decoding and for D < 0 to list decoding. For comparison, we also derive the exact random coding exponent for Forney’s optimum tradeoff decoder [2]. In the case of source coding, we assume discrete memoryless sources with a finite alphabet X and a finite reproduction alphabet X̂. In the case of channel coding, we assume discrete memoryless channels with finite input and output alphabets X and Y, such that for any (x, y) ∈ X × Y the channel probability is positive: P(y | x) > 0. For simplicity, let R denote the exponential rate of a random codebook, such that there exist block lengths n for which e^{nR} is an integer. We assume the size of the codebook is M = e^{nR} for source coding, and M = e^{nR} + 1 for channel coding. Let Q denote the (i.i.d.) distribution according to which the codebook is generated. We also use the definition:
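The codebook setup described above can be sketched as follows. This is a minimal illustration, not code from the paper: the alphabet, the distribution Q, the block length n, and the rate R are arbitrary choices made so that e^{nR} comes out an integer, and each codeword is drawn i.i.d. from Q as the abstract specifies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative finite reproduction alphabet and i.i.d. codebook distribution Q
# (hypothetical values, chosen only for the sketch).
alphabet = np.array([0, 1, 2])
Q = np.array([0.5, 0.3, 0.2])

n = 10                              # block length
R = np.log(2.0)                     # rate in nats, so e^{nR} = 2^n = 1024
M_source = round(np.exp(n * R))     # codebook size for source coding: e^{nR}
M_channel = M_source + 1            # channel coding uses one extra codeword

# The codebook is an M x n array; every symbol is drawn i.i.d. ~ Q.
codebook = rng.choice(alphabet, size=(M_source, n), p=Q)
print(M_source, M_channel, codebook.shape)
```

The only structural point the sketch captures is the one stated in the abstract: both settings draw codewords i.i.d. from the same Q, and they differ only in whether the codebook has e^{nR} or e^{nR} + 1 entries.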


Similar resources

On privacy amplification, lossy compression, and their duality to channel coding

We examine the task of privacy amplification from information-theoretic and coding-theoretic points of view. In the former, we give a one-shot characterization of the optimal rate of privacy amplification against classical adversaries in terms of the optimal type-II error in asymmetric hypothesis testing. The converse bound turns out to be tighter than previous bounds based on smooth min-entrop...


The Achievable Rate-Distortion Region for Distributed Source Coding with One Distortion Criterion and Correlated Messages

In this paper, distributed (or multiterminal) source coding with one distortion criterion and correlated messages is considered. This problem can also be called the “Berger-Yeung problem with correlated messages”. It corresponds to the source coding part of the graph-based framework for transmission of a pair of correlated sources over the multiple-access channel (MAC) where one is lossless and the...


Remote Source Coding under Gaussian Noise: Dueling Roles of Power and Entropy-Power

Lossy source coding under the mean-squared error fidelity criterion is considered. The rate-distortion function can be expressed in closed form only for very special cases, including Gaussian sources. The classical upper and lower bounds look exactly alike, except that the upper bound has the source power (variance) whereas the lower bound has the source entropy-power. This pleasing duality of p...


Superposition Coding for Source Coding with Side Information at the Decoder

Problems of noisy channel coding with side information at the encoder and of lossy source coding with side information at the decoder have many practical applications. Unfortunately, the theoretical performance bounds are obtained by randomly partitioning a randomly generated source or channel code into an ensemble of channel or source codes, which is highly non-practical. A scheme has recently...


On Lossless Quantum Data Compression and Quantum Variable-Length Codes

In Shannon’s Foundation of Information Theory ([27]) perhaps the most basic contributions are Source Coding Theorems (lossy and lossless) and the Channel Coding Theorem. In the most natural and simple source model DMS the source outputs a sequence X1, X2, . . . of independent, identically distributed random variables taking finitely many values. The Lossy Source Coding Theorem says that this se...



Journal:
  • CoRR

Volume: abs/1701.07707  Issue: -

Pages: -

Publication year: 2017